In this paper, we explore the EMOTIV EPOC+ biosensory headset in the context of brain-computer interfaces that record brain signals and convert them into keystrokes. Reinforced by programming language constructs, these signals were able to trigger movements of the Finch robot and to toggle lightbulb objects. Observation and analysis of our case studies with human subjects using these brain-computer interfaces revealed two problems: the subjects' frustration and the time-consuming process of learning to train the device. Future experiments should be conducted on different types of devices to explore the match between the brain signals and the intended actions, and to investigate users' experience in learning to design and develop brain-computer interfaces.
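To make the keystroke-driven control path concrete, the following is a minimal sketch, not the implementation used in the case studies. It assumes the EPOC+ headset has been trained so that its mental commands are emitted as ordinary keystrokes, and it maps those keystrokes to robot actions. The `Finch` class with its `wheels` and `led` methods is a hypothetical stand-in for whichever robot driver is actually used; the key-to-command mapping is likewise illustrative.

```python
# Sketch: map keystrokes (assumed to be produced by trained EPOC+ mental
# commands) to actions on a Finch-like robot. The Finch class below is a
# hypothetical placeholder, not the actual robot API.

from pynput import keyboard


class Finch:
    """Hypothetical stand-in for a Finch robot driver."""

    def wheels(self, left: float, right: float) -> None:
        print(f"wheels: left={left}, right={right}")

    def led(self, r: int, g: int, b: int) -> None:
        print(f"led: ({r}, {g}, {b})")


robot = Finch()

# Illustrative mapping from keystrokes to robot actions.
ACTIONS = {
    "w": lambda: robot.wheels(0.5, 0.5),   # e.g. "push" command -> move forward
    "s": lambda: robot.wheels(0.0, 0.0),   # e.g. "neutral" command -> stop
    "l": lambda: robot.led(0, 255, 0),     # e.g. "lift" command -> light on
}


def on_press(key):
    try:
        action = ACTIONS.get(key.char)
    except AttributeError:
        return  # special keys (shift, ctrl, ...) have no .char attribute
    if action:
        action()


# Listen for keystrokes and dispatch the corresponding action.
with keyboard.Listener(on_press=on_press) as listener:
    listener.join()
```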